An Anniversary of Destruction, Loss, and Bravery in Ukraine - Ukrainians have responded with remarkable dignity and courage, but there is little to romanticize one year into the Russian invasion. - link
How the Government Cancelled Betty Ann’s Debts - For a ninety-one-year-old law-school graduate, the Department of Education discharged more than three hundred thousand dollars in student debt. Could relief be that simple? - link
What Is Ron DeSantis Doing to Florida’s Public Liberal-Arts College? - DeSantis is not simply inveighing against progressive control of institutions. He is using his powers as governor to remake them. - link
Maria Pevchikh, Putin’s Grand Inquisitor - A deputy to Alexey Navalny discusses his near-fatal poisoning, her own probe of Kremlin corruption, and battling Moscow from exile. - link
Greening the Burial of the Dead, in Brooklyn - The historic Green-Wood Cemetery—the final resting place of Leonard Bernstein and half a million others—explores a cutting-edge method of processing human remains: electric cremation. - link
It’s easy to give a bad apology. Here’s how to give a good one.
If you can’t remember the last time you apologized: congratulations, you are perfect — or at least you believe you are. For the rest of us, apologizing is a common, if difficult, part of life.
Among the earliest lessons imparted to children is the art of saying sorry, yet these skills don’t always transfer neatly to adulthood. Relationships are messy and both parties often have some level of culpability. However, the biggest obstacle to apologetic bliss isn’t a complicated argument — it’s self-protective motivations.
Good apologies are notoriously hard to come by, partly because of an inherent resistance to making them in the first place. People hesitate to apologize because they falsely believe it will damage how others perceive them, says Amy Ebesu Hubbard, a professor at the University of Hawaii at Manoa School of Communication and Information. Some view apologizing as admitting defeat and thereby lowering their social status; others think it tarnishes their reputation. On the contrary, a successful apology can bring people closer together and can improve the apologizer’s standing with the receiver, Hubbard says.
There are a number of other psychological barriers preventing people from apologizing, according to Karina Schumann, a professor of psychology at the University of Pittsburgh. Chief among them is a desire to see yourself as a good person — and for others to consider you morally just, too. When someone is upset with you, it’s common to shift into self-protection mode and to trick yourself into believing you didn’t do anything wrong. “A lot of the time, people don’t apologize simply because these self-defensive processes kick in and they come up with all kinds of reasons why they shouldn’t apologize,” Schumann says. “They push blame onto the other person, they think of excuses, all the situational factors that caused them to behave the way they did.” Another impediment to apologizing can be a lack of empathy or concern for the relationship with the wronged party.
Saying sorry effectively boils down to a few simple steps that can be easily replicated and adapted to different situations, from accidentally bumping into a stranger in a crowded bar to insulting the entirety of your best friend’s life choices. The key to successful apologies doesn’t lie in following a formula, though: It’s true sincerity.
According to Marjorie Ingall and Susan McCarthy, the authors of the book Sorry, Sorry, Sorry: The Case for Good Apologies, successful apologies contain six (and a half) components:

1. Say you’re sorry.
2. Name the specific thing you’re apologizing for.
3. Show you understand why it was hurtful.
4. Explain what happened, but don’t make excuses.
5. Say why it won’t happen again.
6. Offer to make it up to them.

The half-step is to listen to the person or people you’ve wronged (these steps work whether you’re apologizing to one person or a group). This is about their experience and emotions, not yours.
“They’re more or less ranked in the order of importance,” McCarthy says. This isn’t to say listening is the least important, but sometimes the hurt person may not want to extend the conversation beyond hearing you say sorry.
Each component can be adjusted to fit the seriousness of the apology. You don’t need to explain what you’re doing to better yourself after accidentally stealing your neighbor’s trash can. But you’ll want to show you understand why punching a wall in a rage is not healthy.
Saying the words “I’m sorry” or “I apologize” is non-negotiable in any decent apology, big or small. Avoid terms like “I regret” or “I feel really bad about what happened.”
For bigger infractions, explicitly saying what you’re apologizing for and why it was wrong helps you take accountability. Be specific and use active language. Think: “I’m sorry I accused your sister of stealing money. It was crappy of me to make assumptions based on judgments.” and not “I regret the events that occurred which caused you to feel upset” or the other gobbledygook commonly found in brand, YouTuber, and notes-app apologies. “If you just dropped a cup of water, you don’t need to explain to that person,” McCarthy says. “But in most cases, it’s really good to specify.”
Even if you aren’t sure why someone is angry with you — but you know they are — apologize for what you can, Hubbard says; that might sound like “I can see that you’re upset with me and I’m very sorry for hurting you.” Piecemeal apologies also apply to situations where you’re being told to say sorry even if you feel you were justified in your actions. Ingall recalls a situation in which her child was asked to apologize for yelling at another student after they were provoked by a bully. “I felt like Max was 100 percent the wronged party and only reacted,” Ingall says. “We figured out that Max could say, ‘I’m sorry for disrupting the class.’”
Explaining why you acted the way you did can add important context, Schumann says. Victims of wrongdoing often see the transgression as intentional and unfair, according to research. Wrongdoers, on the other hand, tend to see their actions as provoked and justified. A non-defensive account of your motivations can help the person you’re apologizing to see that you weren’t acting maliciously. Schumann suggests saying something like, “I want to let you know why my behavior has been like this over the past few weeks just to help you understand where it was coming from. It’s no excuse and I should’ve done better.” Be careful not to make excuses, Ingall stresses. In their book, Ingall and McCarthy write that “I didn’t mean to,” “Some things just fell through the cracks,” and “I knew you’d never understand” are all common excuses.
Describe how you’ll never make the same infraction again with specificity: “I’ll set a reminder in my phone next time so I don’t forget,” “I won’t use that language anymore,” “I’m going to therapy.” It’s not enough to say “I’m taking responsibility for my actions.” How will you take responsibility?
While not applicable in all situations, making up for a bad deed can look like offering to buy a new white rug after you spilled red wine all over it, or publicly correcting the record regarding the embarrassing claims you made about a friend.
These intense and personal apologies are what researcher Yohsuke Ohtsubo calls “costly apologies,” where the wrongdoer is willing to do whatever it takes to repair the relationship. Victims perceive these apologies as being more sincere because they know “that you value the relationship with them more than the cost you pay,” says Ohtsubo, a professor at the University of Tokyo, “which also informs them that you are not likely to do the same transgression again.” The “cost” incurred has less to do with monetary value than with the worth of the relationship.
There are a few hallmarks of a bad apology. Ingall and McCarthy suggest avoiding language like “Sorry if …” (“Sorry if you were offended”), “Sorry but …” (“Sorry, but I had every right to yell”), and “Sorry you …” (“I’m sorry you took that the wrong way”). Don’t include words like “obviously,” “regrettable,” and “unfortunate” either.
Any statement that confers blame on the recipient is a bad apology. “It’s very normal for us to want to point out how they’ve hurt us as well,” Schumann says, “because oftentimes these things aren’t clean-cut in terms of who hurt who.” If you feel like you are also owed an apology, save that for a separate conversation.
By apologizing, you acknowledge your words and actions have caused pain — so don’t minimize the other person’s hurt in order to assuage your ego. “It was just a joke,” “I didn’t mean anything by it,” or “I don’t know why it was such a big deal” are bound to make the other person feel worse, Schumann says.
More important than the timing and means of your apology is its sincerity, Hubbard says. If you’re not ready to say sorry and mean it, you can apologize multiple times: once to clear the air of any awkwardness, and again later when you truly feel repentant.
Don’t worry about where the apology lives within the conversation — focus on being sincere and empathetic instead. A commonly cited study found that apologies were more effective when they came after the wronged party had a chance to share their feelings. One of Hubbard’s studies showed that starting a conversation with an apology can be a springboard to a deeper conversation. Whenever you apologize, be prepared for any range of emotions, and to listen (or for the other person to disengage completely).
In general, the most sincere apologies take place face-to-face or over the phone. The other person can hear your voice, your tone, and read your body language. Text apologies can be utilized if you typically interact with the person you’ve hurt that way. Messages on social media can be an effective way to apologize to someone from your past you don’t communicate with or see in person. Mass apologies on social media should be avoided at all costs.
“It is far healthier to reach out with your actual human voice to your friends who you have actually harmed and say, ‘I’m sorry, I love you, I miss you. Can we talk about this?’” Ingall says. “You will find that to be endlessly more fulfilling than the Notes app apology that, B-T-dubs, everybody ends up messing up anyway.”
There are seemingly endless situations calling for an apology — plenty of ways to screw up, piss people off, or offend — but a few circumstances when you don’t need to change a thing. Women and girls, who are famously maligned for apologizing too frequently, should stop apologizing for apologizing, Ingall says. “We have to be really careful about not over-policing women’s speech and not telling women that the way they talk — whether that’s vocal fry, or rise in inflection at the end of the sentence, or apologizing — is wrong,” she says, “because sometimes there are things we just got to do to make it through the day and to make our life easier.”
Never apologize for existing, taking up space, or living as your authentic self. That’s the version of unapologetic worth aspiring to.
“It’s appropriate to apologize for things that you do or say,” McCarthy says. “You don’t have to apologize for who you are.”
From ELIZA onwards, humans love their digital reflections.
It didn’t take long for Microsoft’s new AI-infused search engine chatbot — codenamed “Sydney” — to display a growing list of discomforting behaviors after it was introduced early in February, with weird outbursts ranging from unrequited declarations of love to painting some users as “enemies.”
As human-like as some of those exchanges appeared, they probably weren’t the early stirrings of a conscious machine rattling its cage. Instead, Sydney’s outbursts reflect its programming, absorbing huge quantities of digitized language and parroting back what its users ask for. Which is to say, it reflects our online selves back to us. And that shouldn’t have been surprising — chatbots’ habit of mirroring us back to ourselves goes back way further than Sydney’s rumination on whether there is a meaning to being a Bing search engine. In fact, it’s been there since the introduction of the first notable chatbot nearly 60 years ago.
In 1966, MIT computer scientist Joseph Weizenbaum released ELIZA (named after the fictional Eliza Doolittle from George Bernard Shaw’s 1913 play Pygmalion), the first program that allowed some kind of plausible conversation between humans and machines. The process was simple: Modeled after the Rogerian style of psychotherapy, ELIZA would rephrase whatever speech input it was given in the form of a question. If you told it a conversation with your friend left you angry, it might ask, “Why do you feel angry?”
Ironically, though Weizenbaum had designed ELIZA to demonstrate how superficial the state of human-to-machine conversation was, it had the opposite effect. People were entranced, engaging in long, deep, and private conversations with a program that was only capable of reflecting users’ words back to them. Weizenbaum was so disturbed by the public response that he spent the rest of his life warning against the perils of letting computers — and, by extension, the field of AI he helped launch — play too large a role in society.
ELIZA built its responses around a single keyword from users, making for a pretty small mirror. Today’s chatbots reflect our tendencies drawn from billions of words. Bing might be the largest mirror humankind has ever constructed, and we’re on the cusp of installing such generative AI technology everywhere.
But we still haven’t really addressed Weizenbaum’s concerns, which grow more relevant with each new release. If a simple academic program from the ’60s could affect people so strongly, how will our escalating relationship with artificial intelligences operated for profit change us? There’s great money to be made in engineering AI that does more than just respond to our questions, but plays an active role in bending our behaviors toward greater predictability. These are two-way mirrors. The risk, as Weizenbaum saw, is that without wisdom and deliberation, we might lose ourselves in our own distorted reflection.
Weizenbaum did not believe that any machine could ever actually mimic — let alone understand — human conversation. “There are aspects to human life that a computer cannot understand — cannot,” Weizenbaum told the New York Times in 1977. “It’s necessary to be a human being. Love and loneliness have to do with the deepest consequences of our biological constitution. That kind of understanding is in principle impossible for the computer.”
That’s why the idea of modeling ELIZA after a Rogerian psychotherapist was so appealing — the program could simply carry on a conversation by asking questions that didn’t require a deep pool of contextual knowledge, or a familiarity with love and loneliness.
Named after the American psychologist Carl Rogers, Rogerian (or “person-centered”) psychotherapy was built around listening and restating what a client says, rather than offering interpretations or advice. “Maybe if I thought about it 10 minutes longer,” Weizenbaum wrote in 1984, “I would have come up with a bartender.”
To communicate with ELIZA, people would type into an electric typewriter that wired their text to the program, which was hosted on an MIT system. ELIZA would scan what it received for keywords that it could flip back around into a question. For example, if your text contained the word “mother,” ELIZA might respond, “How do you feel about your mother?” If it found no keywords, it would default to a simple prompt, like “tell me more,” until it received a keyword that it could build a question around.
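To make that loop concrete, here is a minimal sketch in Python. The keyword rules and fallback prompts are invented for illustration (Weizenbaum’s actual DOCTOR script was far more elaborate), but the basic flow is the one described above: scan the input for a keyword, flip it into a question, and fall back to a neutral prompt otherwise.

```python
import random

# Illustrative keyword rules: each trigger word maps to question templates.
# These are stand-ins, not Weizenbaum's original script.
RULES = {
    "mother": ["How do you feel about your mother?"],
    "angry":  ["Why do you feel angry?", "What makes you angry?"],
    "friend": ["Tell me more about your friend."],
}

DEFAULTS = ["Tell me more.", "Please go on."]

def eliza_reply(text: str) -> str:
    """Scan the input for a known keyword and flip it into a question."""
    for word in text.lower().split():
        stripped = word.strip(".,!?")
        if stripped in RULES:
            return random.choice(RULES[stripped])
    # No keyword found: fall back to a neutral prompt.
    return random.choice(DEFAULTS)

if __name__ == "__main__":
    print(eliza_reply("A conversation with my friend left me angry."))
```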
Weizenbaum intended ELIZA to show how shallow computerized understanding of human language was. But users immediately formed close relationships with the chatbot, stealing away for hours at a time to share intimate conversations. Weizenbaum was particularly unnerved when his own secretary, upon first interacting with the program she had watched him build from the beginning, asked him to leave the room so she could carry on privately with ELIZA.
Shortly after Weizenbaum published a description of how ELIZA worked, “the program became nationally known and even, in certain circles, a national plaything,” he reflected in his 1976 book, Computer Power and Human Reason.
To his dismay, the potential to automate the time-consuming process of therapy excited psychiatrists. People so reliably developed emotional and anthropomorphic attachments to the program that it came to be known as the ELIZA effect. The public received Weizenbaum’s intent exactly backward, taking his demonstration of the superficiality of human-machine conversation as proof of its depth.
Weizenbaum thought that publishing his explanation of ELIZA’s inner functioning would dispel the mystery. “Once a particular program is unmasked, once its inner workings are explained in language sufficiently plain to induce understanding, its magic crumbles away,” he wrote. Yet people seemed more interested in carrying on their conversations than interrogating how the program worked.
If Weizenbaum’s cautions settled around one idea, it was restraint. “Since we do not now have any ways of making computers wise,” he wrote, “we ought not now to give computers tasks that demand wisdom.”
If ELIZA was so superficial, why was it so relatable? Since its responses were built from the user’s immediate text input, talking with ELIZA was basically a conversation with yourself — something most of us do all day in our heads. Yet here was a conversational partner without any personality of its own, content to keep listening until prompted to offer another simple question. That people found comfort and catharsis in these opportunities to share their feelings isn’t all that strange.
But this is where Bing — and all large language models (LLMs) like it — diverges. Talking with today’s generation of chatbots is speaking not just with yourself, but with huge agglomerations of digitized speech. And with each interaction, the corpus of available training data grows.
LLMs are like card counters at a blackjack table. They analyze all the words that have come before and use that knowledge to estimate the probability of what word will most likely come next. Since Bing is a search engine, it still begins with a prompt from the user. Then it builds responses one word at a time, each time updating its estimate of the most probable next word.
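To make the analogy concrete, here is a toy sketch in Python of “predict the most probable next word.” It uses simple word-pair counts over an invented corpus; a real LLM instead runs a neural network over the entire preceding context, but the basic move of turning what came before into a probability estimate for what comes next is the same.

```python
from collections import Counter, defaultdict

# Toy "training data" standing in for the billions of words a real model sees.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word (a simple bigram model).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word_probs(prev: str) -> dict:
    """Estimate the probability of each candidate next word."""
    counts = following[prev]
    total = sum(counts.values())
    return {word: count / total for word, count in counts.items()}

def generate(prompt: str, length: int = 5) -> str:
    """Build a reply one word at a time, always taking the most probable next word."""
    words = prompt.split()
    for _ in range(length):
        probs = next_word_probs(words[-1])
        if not probs:
            break  # no known continuation
        words.append(max(probs, key=probs.get))
    return " ".join(words)

print(next_word_probs("the"))  # {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}
print(generate("the"))         # "the cat sat on the cat"
```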
Once we see chatbots as big prediction engines working off online data — rather than intelligent machines with their own ideas — things get less spooky. It gets easier to explain why Sydney threatened users who were too nosy, tried to dissolve a marriage, or imagined a darker side of itself. These are all things we humans do. In Sydney, we saw our online selves predicted back at us.
But what is still spooky is that these reflections now go both ways.
From influencing our online behaviors to curating the information we consume, interacting with large AI programs is already changing us. They no longer passively wait for our input. Instead, AI is now proactively shaping significant parts of our lives, from workplaces to courtrooms. With chatbots in particular, we use them to help us think and give shape to our thoughts. This can be beneficial, like automating personalized cover letters (especially for applicants where English is a second or third language). But it can also narrow the diversity and creativity that arises from the human effort to give voice to experience. By definition, LLMs suggest predictable language. Lean on them too heavily, and that algorithm of predictability becomes our own.
If ELIZA changed us, it was because simple questions could still prompt us to realize something about ourselves. The short responses had no room to carry ulterior motives or push their own agendas. With the new generation of corporations developing AI technologies, the change is flowing both ways, and the agenda is profit.
Staring into Sydney, we see many of the same warning signs that Weizenbaum called attention to over 50 years ago. These include an overactive tendency to anthropomorphize and a blind faith in the basic harmlessness of handing over both capabilities and responsibilities to machines. But ELIZA was an academic novelty. Sydney is a for-profit deployment of ChatGPT, a $29 billion investment, and part of an AI industry projected to be worth over $15 trillion globally by 2030.
The value proposition of AI grows with every passing day, and the prospect of realigning its trajectory fades. In today’s electrified and enterprising world, AI chatbots are already proliferating faster than any technology that came before. This makes the present a critical time to look into the mirror that we’ve built, before the spooky reflections of ourselves grow too large, and ask whether there was some wisdom in Weizenbaum’s case for restraint.
As a mirror, AI also reflects the state of the culture in which the technology is operating. And the state of American culture is increasingly lonely.
To Michael Sacasas, an independent scholar of technology and author of The Convivial Society newsletter, this is cause for concern above and beyond Weizenbaum’s warnings. “We anthropomorphize because we do not want to be alone,” Sacasas recently wrote. “Now we have powerful technologies, which appear to be finely calibrated to exploit this core human desire.”
The lonelier we get, the more exploitable by these technologies we become. “When these convincing chatbots become as commonplace as the search bar on a browser,” Sacasas continues, “we will have launched a social-psychological experiment on a grand scale which will yield unpredictable and possibly tragic results.”
We’re on the cusp of a world flush with Sydneys of every variety. And to be sure, chatbots are among the many possible implementations of AI that can deliver immense benefits, from protein-folding to more equitable and accessible education. But we shouldn’t let ourselves get so caught up that we neglect to examine the potential consequences. At least until we better understand what it is that we’re creating, and how it will, in turn, recreate us.
Trump took the stage at CPAC, blasted his own party and declared that “I am your retribution.”
Donald Trump is many things, but whatever he is, it’s not a Reagan Republican. In one of his trademark discursive speeches, delivered to the Conservative Political Action Conference (CPAC) on Saturday night, Trump made clear how much his politics diverged from the mold that had defined the Republican Party for generations before he took that infamous ride down an escalator in 2015 and announced he was running for president.
Even if Trump hadn’t tipped his hand when he declared early in his remarks to a mostly full ballroom of diehards in MAGA hats that “we are never going back to the party of Paul Ryan, Karl Rove and Jeb Bush,” the rest of his speech represented a fundamental repudiation of that era of the Republican Party. But more than that, it represented a reversion toward a pre-World War II GOP, with doses of both populism and paleoconservatism.
Perhaps the most jarring change from the past was Trump’s derision of U.S. aid to Ukraine, just days after the Eastern European country marked the one-year anniversary of Russia’s unprovoked invasion. For over a half-century, hawkish interventionist foreign policy—especially towards Russia—had been one of the fundamental principles of the Republican Party. Trump’s election, especially given the questions about Russia’s efforts to sway the 2016 presidential race, put this into question. But Trump’s speech, which followed harsh attacks on Ukrainian President Volodymyr Zelensky throughout the three-day conference from speakers like Rep. Marjorie Taylor Greene (R-Georgia), made it clear how severely the GOP had shifted towards isolationism in recent years.
In his CPAC speech, Trump compared foreign aid, like the more than $75 billion the Biden Administration has provided Ukraine, to a business investment that should be rewarded with an equity stake. “In business, you put up the money, seed money . . . you end up owning the country by the time it’s over.” At another point in the speech, he suggested that U.S. foreign aid to countries should be tied to preferential tariff treatment.
He paired this with a grim view of the United States, rooted in the “American carnage” that defined his 2017 inaugural address and pitted his supporters against shadowy elites—including the “Marxists” he derided in his remarks. For his supporters, he declared, “I am your warrior, I am your justice, and for those who have been wronged and betrayed, I am your retribution,” as he pledged to “eradicate the Deep State,” a group he blamed for so many of his personal ills as well as those of his supporters.
Trump’s populist appeal was not just rooted in the paranoid style of American politics that had once defined much of the American right; it also included a jibe at those fiscal conservatives who have long wanted to cut entitlement spending. “We’re not going back to people that want to destroy our great Social Security system,” he said in a veiled attack on likely rival Gov. Ron DeSantis (R-Florida), who backed Paul Ryan’s budgets while serving in the House of Representatives.
There were still some familiar social conservative elements in Trump’s remarks, though they were transmuted into the modern Republican coalition from the debates of yesteryear. While abortion was rarely mentioned on stage at CPAC and gay marriage seemed almost as archaic a topic of political debate as aid to the contras, transgender issues provoked perhaps Trump’s most fervent applause. Trump said, if elected, he would sign a bill banning sex change procedures for minors, which he characterized as “chemical castration and genital mutilation,” and he received a standing ovation from the ballroom.
The speech felt like yet another milestone for the Republican Party, not just as the conference embraced Trumpism, but as the American right embraced a more continental conservatism. Trump spoke only hours after former Brazilian president Jair Bolsonaro’s own speech, and Brazilian flags could be spotted throughout the crowd, alternating in patches with the Stars and Stripes and, above all, red MAGA hats. CPAC has increasingly embraced the global right—holding a pro-Viktor Orban event in Hungary last year, and partnering in Japan with those who have minimized and denied World War II war crimes. Not all of this is foreign to American politics—after all, the America First slogan was first used by the isolationists who railed against the United States supporting the Allies in World War II before Pearl Harbor. But this strain of politics had remained submerged on the right, popping up in Pat Buchanan’s speeches and Ron Paul’s newsletters. That’s not the case anymore. The question is just how dominant it will be in 2024 and beyond.
A mystery on the minefield: what ails Axar’s bowling? - In this series he has sent down 39 overs for one wicket at an average of 103 compared to the pre-series stats of 47 wickets in eight Tests at 14.29
Silver for Srivatsa-Rizwan Abu pair -
Rest of India retains Irani Cup - Nothing goes right for the home team on final day as it is bowled out for 198 and concedes a 238-run defeat
Women’s Premier League 2023 | Shafali Verma, Meg Lanning demolish RCB, propel DC - Riding on belligerent half-centuries from Shafali Verma and Meg Lanning, Delhi Capitals posted 223 for 2.
Sania Mirza ends her career at place where it began - Sania Mirza, who turned emotional while giving her farewell speech, said the greatest honour for her has been to play for the country for 20 years.
Freight wagons get separated while running -
Arrangements for Attukal Pongala in final stages - Corpn. has taken steps to complete road works, provide drinking water, says Mayor. Funds for cleaning, waste management
NDC calls for halt to construction at St. Stephen’s Church -
Rare images of India’s first R-Day celebrations, Constituent Assembly meetings on display at Book Fair - Archival pictures and reports, drawn from the Parliament Museum and Archives, offering a glimpse of these jubilant scenes are on display at an exhibition at the ongoing New Delhi World Book Fair
Admission of children in private unaided schools under quota system to begin in Andhra Pradesh on March 6 - School Education Department releases schedule of the process; school managements warned against charging more than the fee prescribed by the government
Bakhmut: Fighting in the street but Russia not in control - deputy mayor - The city has seen months of intense fighting - despite its strategic value being questioned.
Ukraine war: The Moldovan enclave surrounded by pro-Russian forces - Residents in the tiny Moldovan enclave of Molovata Noua fear the Ukraine war spilling over.
Sergei Lavrov: Russian foreign minister laughed at for Ukraine war claims - Sergei Lavrov is laughed at in Delhi after saying the Ukraine war was “launched against us”.
US-made cheese can also be called ‘gruyere’, court rules - A US appeal court has ruled that cheeses from outside Switzerland and France can be called “gruyere”.
Ales Bialiatski: Nobel Prize-winning activist sentenced to 10 years in jail - Nobel Prize winner Ales Bialiatski was accused of smuggling cash into Belarus to fund protests.
The sketchy plan to build a Russian Android phone - A Russian tech giant plans to launch new Android phones and tablets. - link
Do masks work? It’s a question of physics, biology, and behavior - A recent review from a prominent scientific source has reignited the debate over masks. - link
Feast your eyes on this image of remnant from earliest recorded supernova - Dark Energy Camera captures rare view of RCW 86, remnant of supernova recorded in 185 CE. - link
Measles exposure at massive religious event in Kentucky spurs CDC alert - Kentucky has one of the lowest vaccination rates among kindergartners in the country. - link
AI-powered Bing Chat gains three distinct personalities - Bing Chat is no longer unhinged, but it can hallucinate more if you want it to. - link
A little test for you. -
This test will predict which of the 18 films listed below is your favourite. Don’t ask me how, but it really works!
Don’t cheat and look at the film list till you have done the maths!
Here goes…
Film Test:
Pick a number from 1-9.
Multiply by 3.
Add 3.
Multiply by 3 again.
Now add the two digits together to find your predicted favourite film in the list of 18 films below.
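(For anyone who wants to see why the trick cannot miss before scrolling on, here is a quick sketch of the arithmetic: whatever number you start with, the steps above always produce the same digit sum, so the “prediction” always points at the same entry in the list.)

```python
def predicted_film_number(n: int) -> int:
    """Follow the steps above and return the digit sum of the result."""
    result = (n * 3 + 3) * 3          # multiply by 3, add 3, multiply by 3 again
    return sum(int(d) for d in str(result))

# Every starting choice from 1 to 9 points at the same film on the list.
print({n: predicted_film_number(n) for n in range(1, 10)})
# -> {1: 9, 2: 9, 3: 9, 4: 9, 5: 9, 6: 9, 7: 9, 8: 9, 9: 9}
```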
Film List:
submitted by /u/Buddy2269
What do you call a guy with 15 and a half rabbits up his bum? -
Kyle. My name’s Kyle.
submitted by /u/GodzeallA
A beautiful young New York woman was so depressed…. -
….that she decided to end her life by throwing herself into the ocean.
But just before she could throw herself from the docks, a handsome young sailor stopped her.
“You have so much to live for,” said the sailor. “Look, I’m off to Europe tomorrow and I can stow you away on my ship. I’ll take care of you, bring you food every day, and keep you happy.”
With nothing to lose, the woman accepted. That night the sailor brought her aboard and hid her in a lifeboat. From then on, every night he would bring her three sandwiches and make love to her until dawn.
Three weeks later she was discovered by the captain during a routine inspection. “What are you doing here?” asked the captain.
“I have an arrangement with one of the sailors,” she replied. “He brings me food and I get a free trip to Europe. Plus he’s screwing me.”
“He certainly is,” replied the captain. “This is the Staten Island Ferry.”
submitted by /u/ODaferio
A Sith, a Jedi, and a Mandalorian walk into a bar… -
They start talking, and after a few drinks the conversation shifts to cars. The Jedi, living a life of austerity and frugality, only has a 1991 Camry. The Sith and the Mando laugh at him, saying he has a Bad Car.
The Sith, having manipulated others into giving him their wealth, shows off his McLaren F1. The patrons at the bar are amazed, and even the Jedi has to admit it’s a nice ride. They both end up saying it’s a Good Car.
The Mandalorian walks around the corner and after a few minutes comes screaming back on his jet pack and blows up the other cars. He has the Beskar.
submitted by /u/FunnyCarp
My girlfriend got a boob job, but I don’t know how to break it to her that I find it makes her less attractive -
Traditionally women tend to get both done
submitted by /u/Purpoisely_Anoying_U